Dynamic reweighting of visual and vestibular cues during self-motion perception.

Authors

  • Christopher R Fetsch
  • Amanda H Turner
  • Gregory C DeAngelis
  • Dora E Angelaki
Abstract

The perception of self-motion direction, or heading, relies on integration of multiple sensory cues, especially from the visual and vestibular systems. However, the reliability of sensory information can vary rapidly and unpredictably, and it remains unclear how the brain integrates multiple sensory signals given this dynamic uncertainty. Human psychophysical studies have shown that observers combine cues by weighting them in proportion to their reliability, consistent with statistically optimal integration schemes derived from Bayesian probability theory. Remarkably, because cue reliability is varied randomly across trials, the perceptual weight assigned to each cue must change from trial to trial. Dynamic cue reweighting has not been examined for combinations of visual and vestibular cues, nor has the Bayesian cue integration approach been applied to laboratory animals, an important step toward understanding the neural basis of cue integration. To address these issues, we tested human and monkey subjects in a heading discrimination task involving visual (optic flow) and vestibular (translational motion) cues. The cues were placed in conflict on a subset of trials, and their relative reliability was varied to assess the weights that subjects gave to each cue in their heading judgments. We found that monkeys can rapidly reweight visual and vestibular cues according to their reliability, the first such demonstration in a nonhuman species. However, some monkeys and humans tended to over-weight vestibular cues, inconsistent with simple predictions of a Bayesian model. Nonetheless, our findings establish a robust model system for studying the neural mechanisms of dynamic cue reweighting in multisensory perception.
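The "statistically optimal integration scheme" the abstract refers to is the standard Bayesian cue-combination rule: each cue is weighted by its reliability (inverse variance), which both minimizes the variance of the combined estimate and predicts how weights should shift when one cue is degraded. A minimal sketch, with hypothetical heading values and noise levels (not data from the study):

```python
import numpy as np

def integrate_cues(mu_vis, sigma_vis, mu_vest, sigma_vest):
    """Reliability-weighted combination of visual and vestibular heading cues.

    Each cue's weight is proportional to its reliability (1 / variance),
    so the combined estimate has lower variance than either cue alone.
    """
    r_vis = 1.0 / sigma_vis**2    # visual reliability
    r_vest = 1.0 / sigma_vest**2  # vestibular reliability
    w_vis = r_vis / (r_vis + r_vest)
    w_vest = r_vest / (r_vis + r_vest)
    mu_comb = w_vis * mu_vis + w_vest * mu_vest
    sigma_comb = np.sqrt(1.0 / (r_vis + r_vest))
    return mu_comb, sigma_comb, w_vis

# Degrading the visual cue (larger sigma) shifts weight toward the
# vestibular cue, the trial-by-trial reweighting the study tests for.
print(integrate_cues(2.0, 1.0, 0.0, 2.0))  # reliable vision: w_vis = 0.8
print(integrate_cues(2.0, 4.0, 0.0, 2.0))  # noisy vision:    w_vis = 0.2
```

Under this rule the model's prediction is parameter-free: the weights measured from cue-conflict trials should match those computed from each cue's single-cue discrimination threshold, which is the comparison the abstract describes as violated when subjects over-weight the vestibular cue.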


Similar articles

Multimodal Integration during Self-Motion in Virtual Reality

When moving through our environment, the human brain must integrate information from our muscles and joints (proprioception), the acceleration detectors in our inner ear (vestibular cues) and dynamic visual information (optic flow). While past research has focused on understanding how each of these modalities can be used to perceive different aspects of self-motion independently, very little is...


Slow changing postural cues cancel visual field dependence on self-tilt detection.

Interindividual differences influence the multisensory integration process involved in spatial perception. Here, we assessed the effect of visual field dependence on self-tilt detection relative to upright, as a function of static vs. slow changing visual or postural cues. To that aim, we manipulated slow rotations (i.e., 0.05°·s⁻¹) of the body and/or the visual scene in pitch. Participants h...


Visual and Vestibular Perceptual Thresholds Each Demonstrate Better Precision at Specific Frequencies and Also Exhibit Optimal Integration

Prior studies show that visual motion perception is more precise than vestibular motion perception, but it is unclear whether this is universal or the result of specific experimental conditions. We compared visual and vestibular motion precision over a broad range of temporal frequencies by measuring thresholds for vestibular (subject motion in the dark), visual (visual scene motion...


Vertical linear self-motion perception during visual and inertial motion: more than weighted summation of sensory inputs.

We evaluated visual and vestibular contributions to vertical self-motion perception by exposing subjects to various combinations of 0.2 Hz vertical linear oscillation and visual scene motion. The visual stimuli presented via a head-mounted display consisted of video recordings of the test chamber from the perspective of the subject seated in the oscillator. In the dark, subjects accurately repo...


Temporoparietal encoding of space and time during vestibular-guided orientation

When we walk in our environment, we readily determine our travelled distance and location using visual cues. In the dark, estimating travelled distance uses a combination of somatosensory and vestibular (i.e., inertial) cues. The observed inability of patients with complete peripheral vestibular failure to update their angular travelled distance during active or passive turns in the dark implie...



Journal:
  • The Journal of Neuroscience : the official journal of the Society for Neuroscience

Volume 29, Issue 49

Pages  -

Published 2009